Regularization Methods for Additive Models

Authors

  • Marta Avalos
  • Yves Grandvalet
  • Christophe Ambroise
Abstract

This paper tackles the problem of model complexity in the context of additive models. Several methods have been proposed to estimate smoothing parameters and to perform variable selection, but these procedures are inefficient or computationally expensive in high dimensions. The lasso technique has also been adapted to additive models; however, its experimental performance has not been analyzed. We propose a modified lasso for additive models that improves variable selection. We also develop a benchmark to examine its practical behavior and to compare it with forward selection. Our simulation studies suggest that the proposed method is able to carry out model selection. The lasso technique performs better than forward selection in the most complex situations, and the computing time of the modified lasso is considerably smaller since it does not depend on the number of relevant variables.
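The modified lasso itself is specified in the paper; the toy example below is only a hedged sketch of the general mechanism the abstract relies on: an l1 penalty applied to per-variable basis expansions drives the coefficients of irrelevant variables exactly to zero, performing variable selection in an additive model. The function names (`lasso_cd`, `soft_threshold`), the polynomial basis, and the penalty level are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

# Illustrative sketch only -- NOT the paper's modified lasso. A plain lasso,
# fitted by coordinate descent, selects variables in an additive model by
# zeroing the basis coefficients of irrelevant inputs.

def soft_threshold(z, t):
    """Soft-thresholding operator used by coordinate-descent lasso."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=300):
    """Plain coordinate-descent lasso; columns of X are assumed standardized."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            # partial residual excluding feature j
            r = y - X @ beta + X[:, j] * beta[j]
            beta[j] = soft_threshold(X[:, j] @ r, lam * n) / col_sq[j]
    return beta

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(-1.0, 1.0, size=(n, 3))                     # 3 candidate variables
y = np.sin(np.pi * x[:, 0]) + 0.1 * rng.standard_normal(n)  # only x_0 is relevant

# Additive-model design: each variable expanded into a small polynomial basis.
X = np.column_stack([x[:, j] ** d for j in range(3) for d in (1, 2, 3)])
X = (X - X.mean(axis=0)) / X.std(axis=0)
yc = y - y.mean()

beta = lasso_cd(X, yc, lam=0.1)
selected = {j for j in range(3) if np.abs(beta[3 * j:3 * j + 3]).max() > 1e-6}
print("selected variables:", selected)
```

Because soft-thresholding produces exact zeros, variables whose basis coefficients are all zero are dropped from the fitted model, which is the selection behavior the abstract compares against forward selection.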


Similar articles

Sparse Regularization for High Dimensional Additive Models

We study the behavior of the l1 type of regularization for high dimensional additive models. Our results suggest remarkable similarities and differences between linear regression and additive models in high dimensional settings. In particular, our analysis indicates that, unlike in linear regression, l1 regularization does not yield optimal estimation for additive models of high dimensionality....


On High Dimensional Post-Regularization Prediction Intervals

This paper considers the construction of prediction intervals for future observations in high dimensional regression models. We propose a new approach to evaluate the uncertainty for estimating the mean parameter based on the widely-used penalization/regularization methods. The proposed method is then applied to construct prediction intervals for sparse linear models as well as sparse additive ...


Penalized estimation in additive varying coefficient models using grouped regularization

Additive varying coefficient models are a natural extension of multiple linear regression models, allowing the regression coefficients to be functions of other variables. Therefore these models are more flexible to model more complex dependencies in data structures. In this paper we consider the problem of selecting in an automatic way the significant variables among a large set of variables, w...


The Entropy Regularization Information Criterion

Effective methods of capacity control via uniform convergence bounds for function expansions have been largely limited to Support Vector machines, where good bounds are obtainable by the entropy number approach. We extend these methods to systems with expansions in terms of arbitrary (parametrized) basis functions and a wide range of regularization methods covering the whole range of general li...


Convex-constrained Sparse Additive Modeling and Its Extensions

Sparse additive modeling is a class of effective methods for performing high-dimensional nonparametric regression. In this work we show how shape constraints such as convexity/concavity and their extensions, can be integrated into additive models. The proposed sparse difference of convex additive models (SDCAM) can estimate most continuous functions without any a priori smoothness assumption. M...



Journal:

Volume:   Issue:

Pages:  -

Publication date: 2003